
feat(link): add container import UI with drag-drop decryption (task 42) #155

Merged
mpiton merged 7 commits into main from
feat/task-42-import-containers-ui
May 7, 2026

Conversation

Owner

@mpiton mpiton commented May 6, 2026

Summary

Enables Link Grabber to import container files (DLC, CCF, RSDF, Metalink, Meta4) via drag-and-drop, delegating decryption to the vortex-mod-containers plugin. Extracted URLs are batched into a single link resolution call. Closes task 42 from prd-v2-roadmap.

Why

Vortex users need to import container files from sites like JDownloader archives and direct download services. The container format ecosystem is handled by plugin infrastructure (Extism WASM), so the UI just needs to accept files and orchestrate the decryption flow without reimplementing format logic.

Changes

  • Add LinkGrabberView.handleContainerFiles() to accept dropped container files and dispatch to link_import_container IPC command
  • Add PasteZone.handleDrop() container detection: files with extensions .dlc, .ccf, .rsdf, .metalink, .meta4 route to onContainerFiles callback
  • Export isContainerFile() and CONTAINER_EXTENSIONS from PasteZone for component reuse
  • Create src/types/container.ts with ImportContainerResult interface (format, urls, packageId, packageName)
  • Add Tauri IPC handler link_import_container that calls CommandBus::handle_import_container
  • Implement CommandBus::handle_import_container: validates extension and file size, calls plugin to decrypt, extracts URLs, creates package, returns outcome
  • Implement PluginLoader::decrypt_container in ExtismPluginLoader: scans registry for enabled Container plugin exporting "decrypt", invokes it, maps errors
  • Add trait method to PluginLoader port with default NotFound return
  • Batch URL resolution: handleContainerFiles collects all URLs from all imported containers, calls resolveLinks() once instead of per-container
  • Add i18n keys for container import toast messages (English + French)
  • Refactor PluginRegistry::call_plugin and call_plugin_bytes: extract generic call_plugin_inner<I: ToBytes> helper to eliminate ~25 lines of duplicated lock/poison-check logic
  • Drop redundant file_name field from ImportContainerOutcome (always equals package_name)
  • Remove defensive MAX_CONTAINER_BYTES check from IPC layer (Tauri deserializes before handler; validation at handler layer is single source of truth)
  • Extract test fixture helpers: Fixture struct + fixture() / stub_fixture() functions to reduce ~40 lines of repeated setup
  • Trim WHAT-comments explaining obvious code behavior; keep single-line WHY comments for non-obvious logic
  • Change tracing::info! to tracing::debug! in plugin registry per-call logs to reduce noise
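The detection and batching steps above can be sketched in TypeScript. CONTAINER_EXTENSIONS matches the list in this PR; the function bodies are illustrative assumptions, not the actual PasteZone source.

```typescript
// Sketch of the extension-based detection PasteZone exports; the names
// mirror the PR, the bodies are assumptions for illustration.
const CONTAINER_EXTENSIONS = [".dlc", ".ccf", ".rsdf", ".metalink", ".meta4"];

function isContainerFile(fileName: string): boolean {
  // Case-insensitive suffix match against the supported container formats.
  const lower = fileName.toLowerCase();
  return CONTAINER_EXTENSIONS.some((ext) => lower.endsWith(ext));
}

// Split a drop: containers route to the onContainerFiles callback,
// everything else stays on the plain-text paste path.
function partitionDrop<T extends { name: string }>(files: T[]) {
  const containers = files.filter((f) => isContainerFile(f.name));
  const others = files.filter((f) => !isContainerFile(f.name));
  return { containers, others };
}
```

A `.DLC` file dropped alongside a `.txt` would be partitioned into the container path while the text file falls through to normal paste handling.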

Testing

  • cargo test --workspace: 1493 passed, 7 ignored
  • cargo clippy --workspace -- -D warnings: clean
  • npx vitest run: 702 passed across 85 test files
  • oxlint . && tsc -b: clean
  • Added 4 new tests in PasteZone.test.tsx for container file detection and callback forwarding
  • Verified container import flow: IPC call → plugin decrypt → URL batch resolve → toast confirmation

Related Issues

  • Closes task 42 (prd-v2-roadmap)

Checklist

  • Tests added and passing locally
  • No secrets, debug prints, or commented-out code
  • Self-reviewed the diff (agents reviewed for reuse, quality, efficiency)
  • Cargo clippy and tsc -b clean
  • Vitest full suite passing
  • Two-commit structure: feature + simplification

Summary by cubic

Adds drag‑and‑drop import for .dlc/.ccf/.rsdf/.metalink/.meta4 in Link Grabber, decrypted via vortex-mod-containers. Resolves and checks URLs in 500‑URL chunks with guards against stale updates; failure toasts now show the backend reason. Satisfies Linear task 42.

  • New Features

    • IPC link_import_container(fileName, fileBytes): validates extension and 1 MiB cap, decrypts via PluginLoader::decrypt_container, trims URLs, creates a Container package, returns format, urls, packageId, packageName; missing plugin maps to an install hint (includes .meta4); probe failures surface as PluginError.
    • ExtismPluginLoader adds decrypt_container: selects the first enabled Container plugin exposing decrypt and calls it via PluginRegistry::call_plugin_bytes; returns NotFound when none loaded.
    • UI: PasteZone exports CONTAINER_EXTENSIONS/isContainerFile and forwards to onContainerFiles; LinkGrabberView pre‑checks 1 MiB before reading, shows toasts (failure includes reason), aggregates URLs, then resolves in 500‑URL chunks and applies a single merged result. Online checks and duplicate detection are chunked too, stale completions are guarded with a resolveBatchRef, per‑chunk error fallbacks are batch‑scoped, and overlapping imports are blocked with isImportingContainers.
  • Refactors

    • Extracted PluginRegistry::call_plugin_inner shared by call_plugin/call_plugin_bytes with per‑call logs at debug; moved the IPC result type to src/types/container.ts.
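The 500-URL chunking described above can be sketched as follows; MAX_URLS mirrors the backend cap the summary mentions, and the dispatch callback shape is an assumption rather than the real resolveLinks mutation.

```typescript
// Sketch of bounded resolve batching; MAX_URLS mirrors the cap in
// handle_resolve_links per the PR text, the dispatch signature is assumed.
const MAX_URLS = 500;

function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// One dispatch per bounded chunk instead of a single oversized call
// that the backend resolver would reject outright.
function resolveInChunks(urls: string[], dispatch: (args: { urls: string[] }) => void): void {
  for (const batch of chunk(urls, MAX_URLS)) {
    dispatch({ urls: batch });
  }
}
```

With 1200 aggregated URLs this yields three dispatches of 500, 500, and 200, so a multi-container drop above the cap degrades to more calls rather than a failed batch.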

Written for commit ae1f591. Summary will update on new commits.

Summary by CodeRabbit

  • New Features
    • Drag-and-drop container import: import container files (e.g., .dlc) to extract links, create packages, and feed them into the existing resolve pipeline.
  • UI / i18n
    • Per-file success/failure toasts added (English & French); large imports are staged and reported per-file.
  • Behavior
    • Container files are detected and routed separately from pasted text; imports enforce size/extension limits and chunk/dedupe/online-check workflows.
  • Documentation
    • CHANGELOG expanded with Task 42 and plugin/container import details.
  • Tests
    • Added unit, integration, and frontend tests covering success and failure scenarios.

mpiton added 2 commits May 6, 2026 15:36
Drag-drop .dlc/.ccf/.rsdf/.metalink/.meta4 into Link Grabber paste
zone now decrypts via vortex-mod-containers plugin (task 41) and
feeds extracted URLs into the existing resolve / online-check /
duplicate-detect pipeline. Auto-creates a `source_type=Container`
package per imported file.

New IPC `link_import_container(file_name, file_bytes)` validates
extension + caps payload at 1 MiB, calls the new
`PluginLoader::decrypt_container(bytes)` port, parses the plugin
JSON, and persists the container package. The Extism adapter scans
for the first enabled `Container`-category plugin exporting
`decrypt` and ships raw bytes via the new
`PluginRegistry::call_plugin_bytes` helper (existing
`call_plugin` only takes `&str`, would corrupt non-UTF-8 blobs).

`PasteZone` now exports `CONTAINER_EXTENSIONS` + `isContainerFile`
predicate and surfaces dropped containers through a dedicated
`onContainerFiles(File[])` callback rather than synthesising fake
`container:<name>` URLs that LinkGrabberView used to drop on the
floor (the original `LinkGrabberView.tsx:67` TODO).

Container password protection is wired in spec but vacuously
satisfied: vortex-mod-containers v1.0 uses fixed historic AES keys
per ADR-001 and none of the four supported formats has a per-file
password layer. CCF v2 + DLC v3 service-fetch are deferred to
v1.1, so no `password_required` state can flow through
`decrypt_container` until the plugin gains the capability.

16 new tests:
- 8 backend (`import_container::tests`): golden path, validation
  rejections (blank/extension/empty/oversize), plugin NotFound,
  zero-link response, invalid JSON
- 1 port default test
- 1 adapter "no container plugin loaded" test
- 4 frontend `PasteZone.test.tsx` cases (drop callback,
  callback-missing fallback, text-only drop, isContainerFile)
- 2 frontend `LinkGrabberView.test.tsx` cases (drop → import_container
  → resolve chain; IPC failure → toast.error)

cargo test --workspace: 1493 pass / 7 ignored
cargo clippy + cargo fmt clean
vitest run: 702 pass
oxlint + tsc -b clean
- Generic `PluginRegistry::call_plugin_inner` shared by `call_plugin`
  and `call_plugin_bytes`; drops ~25 duplicated lines.
- `tracing::info!` → `debug!` on plugin call (per-call hot path).
- Drop redundant `MAX_CONTAINER_BYTES` check in IPC layer; Tauri has
  already deserialized the buffer before the function body runs, so
  the "defensive cap" was a dead branch.
- `handle_import_container` delegates package creation to the
  existing `handle_create_package` instead of inlining UUID mint +
  `Package::new` + `repo.save` + event publish.
- Drop redundant `file_name` field from `ImportContainerOutcome` /
  `ImportContainerResultDto` (always equal to `package_name`).
- Move inline IPC type to `src/types/container.ts` per project
  convention.
- Frontend batches all extracted URLs into a single `link_resolve`
  call after the import loop instead of one IPC per container.
- Test fixture sprawl collapsed into `Fixture` / `fixture(loader)` /
  `stub_fixture()` helpers (~40 lines saved).
- Trim WHAT comments (4 sites): `LinkGrabberView::handleContainerFiles`,
  `PasteZone::handleDrop`, `import_container::MAX_CONTAINER_BYTES`,
  `extism_loader::decrypt_container`. Keep WHY-only one-liners.
- Drop "(task 41)" reference in `import_container` module doc; the
  CHANGELOG carries that lineage.

cargo test --workspace: 1493 pass / 7 ignored
cargo clippy + cargo fmt clean
vitest run: 702 pass
oxlint + tsc -b clean

Net: -173 / +98 lines.
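The byte-safety concern above (call_plugin only takes &str, which would corrupt non-UTF-8 blobs) has a frontend mirror: container payloads must travel over IPC as raw bytes, never through a string decode. A minimal sketch, where MAX_CONTAINER_BYTES assumes the 1 MiB cap stated in the commit message and the helper names are hypothetical:

```typescript
// Assumed mirror of the backend's 1 MiB container cap.
const MAX_CONTAINER_BYTES = 1024 * 1024;

// Reject oversized payloads before allocating or shipping them.
function withinContainerCap(size: number): boolean {
  return size <= MAX_CONTAINER_BYTES;
}

// Array.from keeps each byte 0-255 intact; round-tripping the buffer
// through a string decode would mangle non-UTF-8 sequences (e.g. raw
// AES ciphertext) before the plugin ever sees them.
function toIpcBytes(buffer: ArrayBuffer): number[] {
  return Array.from(new Uint8Array(buffer));
}
```

Checking file.size against the cap before reading avoids allocating a buffer that the backend would reject anyway.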
@github-actions github-actions Bot added the documentation, rust, and frontend labels on May 6, 2026

coderabbitai Bot commented May 6, 2026

Review Change Stack

Note

Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior by changing the reviews.auto_review.auto_pause_after_reviewed_commits setting.

Use the following commands to manage reviews:

  • @coderabbitai resume to resume automatic reviews.
  • @coderabbitai review to trigger a single review.


Walkthrough

Adds end-to-end container import: plugin byte-call support, PluginLoader decrypt probing, CommandBus handler to parse decrypted metalink-like JSON and create a container package, Tauri IPC command, frontend PasteZone and LinkGrabber wiring, types/i18n, and tests.

Changes

Container Import Feature

  • Plugin registry, binary-safe call path (src-tauri/src/adapters/driven/plugin/registry.rs): Adds a generic inner plugin call and a call_plugin_bytes wrapper to invoke plugins with raw bytes.
  • PluginLoader port (src-tauri/src/domain/ports/driven/plugin_loader.rs): Adds decrypt_container(&self, bytes: &[u8]) to the trait with a default DomainError::NotFound implementation and a unit test.
  • Extism plugin loader (src-tauri/src/adapters/driven/plugin/extism_loader.rs): Implements decrypt_container by probing enabled container-category plugins (sorted) for decrypt, returning the first successful result; adds a NotFound test.
  • Command definitions and exports (src-tauri/src/application/commands/mod.rs): Adds the ImportContainerCommand struct and module declaration, and re-exports ImportContainerOutcome.
  • Command handler and validation (src-tauri/src/application/commands/import_container.rs): Implements handle_import_container: validates filename/extension/size, calls the plugin decrypt, parses the metalink-like JSON, extracts URLs, creates a container-sourced Package, and returns ImportContainerOutcome; includes comprehensive tests.
  • Test support (src-tauri/src/application/commands/tests_support.rs): Adds build_package_bus_with_plugin_loader and adjusts build_package_bus to allow injecting custom plugin loaders in tests.
  • Tauri IPC adapter (src-tauri/src/adapters/driving/tauri_ipc.rs): Adds the link_import_container command, ImportContainerResultDto mapping, and error-to-message mapping, and includes command types in adapter imports.
  • Public IPC exports (src-tauri/src/lib.rs): Adds IPC command re-exports including link_import_container and updates tauri::generate_handler!.
  • Types and localization (src/types/container.ts, src/i18n/locales/en.json, src/i18n/locales/fr.json): Adds the ImportContainerResult TypeScript interface and three i18n keys (singular/plural success, failure) in English and French.
  • PasteZone container detection (src/views/LinkGrabberView/PasteZone.tsx): Exports CONTAINER_EXTENSIONS and isContainerFile, adds onContainerFiles?: (files: File[]) => void, and forwards container drops to onContainerFiles.
  • LinkGrabberView UI wiring (src/views/LinkGrabberView/LinkGrabberView.tsx): Adds handleContainerFiles to read file bytes, call link_import_container, toast per-file outcomes, aggregate returned URLs, resolve them in chunks, and wire onContainerFiles into PasteZone.
  • Frontend tests (src/views/LinkGrabberView/__tests__/*): Adds and updates tests for container-drop forwarding, container detection, and LinkGrabberView DLC import success and failure paths; updates testing imports.
  • CHANGELOG (CHANGELOG.md): Adds a Task 42 entry documenting the container import UI and vortex-mod-containers plugin v1.0.0 details.
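The Extism loader's probe order described above can be modeled language-agnostically in TypeScript; the registry record shape here is an assumption, and only the selection rule (alphabetically-first enabled Container plugin exporting decrypt, else NotFound) follows the PR.

```typescript
// Illustrative model of a registry entry; not the actual Rust types.
interface PluginInfo {
  name: string;
  category: string;
  enabled: boolean;
  exports: string[];
}

// Pick the plugin decrypt_container would call, or null (maps to NotFound).
function pickDecryptPlugin(plugins: PluginInfo[]): string | null {
  const candidates = plugins
    .filter((p) => p.enabled && p.category === "Container" && p.exports.includes("decrypt"))
    .sort((a, b) => a.name.localeCompare(b.name)); // sorted by name: alphabetically-first wins
  return candidates.length > 0 ? candidates[0].name : null;
}
```

Note the sort means precedence is alphabetical by plugin name, not load order, which is exactly the doc/behavior mismatch one review comment flags below.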

Sequence Diagram

sequenceDiagram
    actor User
    participant FrontendUI as Frontend UI
    participant PasteZone
    participant TauriIPC as Tauri IPC
    participant CommandBus
    participant PluginLoader
    participant PackageRepo as Package Repository
    participant EventBus

    User->>PasteZone: Drop container file
    PasteZone->>PasteZone: isContainerFile() check
    PasteZone->>FrontendUI: onContainerFiles([file])
    FrontendUI->>FrontendUI: Read file bytes
    FrontendUI->>TauriIPC: link_import_container(name, bytes)
    TauriIPC->>CommandBus: handle_import_container(cmd)
    CommandBus->>CommandBus: Validate extension & size
    CommandBus->>PluginLoader: decrypt_container(bytes)
    PluginLoader->>PluginLoader: Probe container plugins in order
    PluginLoader-->>CommandBus: Decrypted JSON (metalink)
    CommandBus->>CommandBus: Parse response, extract URLs
    CommandBus->>PackageRepo: Create package (source=Container)
    PackageRepo-->>CommandBus: package_id
    CommandBus->>EventBus: PackageCreated event
    CommandBus-->>TauriIPC: ImportContainerOutcome
    TauriIPC-->>FrontendUI: ImportContainerResultDto
    FrontendUI->>FrontendUI: Show toast and dispatch resolveLinks

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~35 minutes

Possibly related PRs

  • mpiton/vortex#19: Main PR builds directly on and modifies the Link Grabber/PasteZone functionality introduced in the retrieved PR (changing container-file handling and adding a new IPC/command path for container import), so they are related.
  • mpiton/vortex#145: Both PRs touch the Link Grabber duplicate-detection/resolve flow (LinkGrabberView wiring, detectBatchRef/duplicate-chunking logic and related IPC handling), so the changes are directly related at the code level.
  • mpiton/vortex#110: Both PRs modify the plugin-loader surface and Extism plugin adapter (adding complementary methods: retrieved PR adds find_installed_manifest while the main PR adds decrypt_container and call_plugin_bytes), so they touch the same plugin loader/adapter code paths and are related.

Suggested labels

ui

Poem

🐰 I found a tiny .dlc in the hay,
The plugin hummed softly and whisked links away,
URLs hopped out, neat and spry,
Resolved and de-duped beneath the sky,
A rabbit’s small victory — hip-hip-hooray!

🚥 Pre-merge checks | ✅ 5
✅ Passed checks (5 passed)

  • Title check: Passed. The title accurately captures the main objective: adding the container import UI with drag-and-drop decryption support, directly matching the core feature introduced across the pull request.
  • Docstring Coverage: Passed. Docstring coverage is 90.00%, which meets the required threshold of 80.00%.
  • Linked Issues check: Skipped because no linked issues were found for this pull request.
  • Out of Scope Changes check: Skipped because no linked issues were found for this pull request.
  • Description Check: Skipped; CodeRabbit's high-level summary is enabled.





@chatgpt-codex-connector chatgpt-codex-connector Bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 6be272d7b0


Comment on lines +213 to +216
defaultValue: `Imported ${result.urls.length} links from ${result.packageName}`,
}),
);
aggregatedUrls.push(...result.urls);

P1: Preserve container package mapping for started downloads

link_import_container returns packageId, but this handler drops that association and only accumulates raw URLs; later starts use the normal download_start flow, so the container package created on import is never auto-attached to those downloads. In practice, importing a container creates an empty package entry and users lose the “import as one package” behavior the backend command is trying to provide.


Owner Author


@chatgpt-codex-connector

link_import_container returns packageId, but this handler drops that association and only accumulates raw URLs

Acknowledged but out of scope for this PR. Carrying the packageId end-to-end requires plumbing it through link_resolve and download_start (and the package read-side) so created downloads attach to the import package — that's a multi-handler, schema-touching change. Deferring to a follow-up task; this PR's package row is intentionally a placeholder until that pipe exists.


Understood. I reviewed the trigger and PR context, and this specific feedback is already explicitly deferred by the PR author as a multi-handler, schema-touching follow-up (propagating packageId through link_resolve and download_start, plus package read-side wiring).

Given the current code shape, that’s accurate: download_start currently accepts only url, destination, and optional metadata—not a package association—so implementing this here would be broader than a targeted PR follow-up fix and would change command/API contracts across frontend + Tauri + application layers.

So for this follow-up action, I made no code changes and therefore did not create a commit or PR.

Checks run

  • git status --short
  • git rev-parse --abbrev-ref HEAD
  • sed -n '1,280p' src/views/LinkGrabberView/LinkGrabberView.tsx
  • sed -n '70,150p' src-tauri/src/adapters/driving/tauri_ipc.rs
  • rg "download_start" -n src src-tauri | head -n 40


Comment on lines +226 to +227
if (aggregatedUrls.length > 0) {
resolveLinks({ urls: aggregatedUrls });

P1: Split aggregated container URLs into bounded resolve batches

This sends all extracted URLs in a single link_resolve call, but the backend resolver rejects payloads above 500 URLs (handle_resolve_links hard cap). A realistic multi-file or large container import can exceed that limit, causing the whole resolve step to fail after successful decrypt/import toasts, leaving users with no resolved rows.


Owner Author


@chatgpt-codex-connector

A realistic multi-file or large container import can exceed that limit, causing the whole resolve step to fail after successful decrypt/import toasts.

Fixed in 77c6ff4: handleContainerFiles now slices the aggregated URL list into 500-URL chunks (matching MAX_URLS in handle_resolve_links) and dispatches one resolveLinks mutation per chunk. Multi-container drops above the cap no longer fail the whole batch.


Summary

  • I reviewed the trigger and current branch state; the reported resolver-cap issue is already fixed in commit 77c6ff4, and handleContainerFiles now batches container URLs into bounded resolveLinks calls using slices rather than one oversized request. No additional code changes were needed. (src/views/LinkGrabberView/LinkGrabberView.tsx, lines 238-240)
  • I did not create a new commit or PR because the workspace had no new modifications to commit after verification.

Testing / Checks

  • git status --short && git log --oneline -n 5 && rg -n "MAX_URLS|chunk|resolveLinks\\(" src/views/LinkGrabberView/LinkGrabberView.tsx
  • nl -ba src/views/LinkGrabberView/LinkGrabberView.tsx | sed -n '220,255p'



@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 7

🧹 Nitpick comments (1)
src/types/container.ts (1)

1-5: ⚡ Quick win

Narrow format to the backend’s closed set.

ImportContainerResult is a new IPC-facing type, and the backend only emits a fixed set of container formats. Keeping format: string drops exhaustiveness checks and makes format typos invisible in the UI/tests.

💡 Suggested change
+export type ContainerFormat = "dlc" | "ccf" | "rsdf" | "metalink";
+
 export interface ImportContainerResult {
-  format: string;
+  format: ContainerFormat;
   urls: string[];
   packageId: string;
   packageName: string;
 }
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@src/types/container.ts` around lines 1 - 5, ImportContainerResult currently
types the field format as string which prevents exhaustiveness checks; change
format in the ImportContainerResult interface to a closed string union matching
the backend’s allowed container formats (replace format: string with format: 'X'
| 'Y' | ...), update any code that constructs or switches on
ImportContainerResult.format to use the new union values, and adjust
tests/typeguards to handle the explicit variants so typos become compile-time
errors; reference the ImportContainerResult interface and any switch/if branches
that inspect .format when making the change.
🤖 Prompt for all review comments with AI agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

Inline comments:
In `@CHANGELOG.md`:
- Line 35: Update the headline "11 new tests" to match the enumerated breakdown
by changing it to "16 new tests" (since the list sums to 8 backend + 1 port + 1
adapter + 4 frontend PasteZone + 2 frontend LinkGrabberView = 16), or
alternatively adjust the per-section counts to total 11 if that was intended;
edit the changelog sentence containing "11 new tests" and ensure the numbers in
the parenthetical breakdown (the "8 backend (...) , 1 port ..., 1 adapter ..., 4
frontend PasteZone ..., 2 frontend LinkGrabberView ...") are consistent with the
headline.

In `@src-tauri/src/adapters/driven/plugin/extism_loader.rs`:
- Around line 424-445: The loop currently treats errors from
self.registry.function_exists(info.name(), "decrypt") as mere warnings and
ultimately returns DomainError::NotFound, which hides plugin probe failures;
change the logic in the decrypt lookup loop so that if function_exists returns
Err(e) you return Err(DomainError::PluginError(...)) (include info.name() and
the error e in the message) instead of continuing, while keeping the existing
behavior of calling self.registry.call_plugin_bytes(info.name(), "decrypt",
bytes) when function_exists is Ok(true); ensure any call_plugin_bytes errors are
still mapped to DomainError::PluginError as before.

In `@src-tauri/src/adapters/driving/tauri_ipc.rs`:
- Around line 1014-1016: Update the fallback message in the
AppError::Domain(DomainError::NotFound(_)) match arm in tauri_ipc.rs to include
the missing ".meta4" extension so users see the full list of supported container
formats; locate the match arm that currently returns "Install
vortex-mod-containers to import .dlc/.ccf/.rsdf/.metalink files" and add
".meta4" into that string (e.g., include ".meta4" among the extensions) so the
guidance is complete.

In `@src-tauri/src/application/commands/import_container.rs`:
- Around line 71-76: The collected URLs currently map from response.links ->
l.url and only check u.trim().is_empty(), leaving surrounding whitespace in the
stored strings; update the pipeline that builds urls (the mapping from
response.links and the variable urls) to trim each l.url (e.g., produce a
trimmed String) before collecting and then filter out empty trimmed strings so
urls contains clean, whitespace-free URLs for later resolve_batch.

In `@src-tauri/src/domain/ports/driven/plugin_loader.rs`:
- Around line 139-150: The docstring for container decryption is incorrect about
precedence—ExtismPluginLoader::decrypt_container() sorts enabled Container
plugins by name before probing, so the winner is the alphabetically-first
enabled container plugin, not the "first loaded"; update the comment on the
decrypt_container() port (referencing Container category and the decrypt
function) to state that enabled container plugins are tried in alphabetical
order by plugin name (or alternatively remove the name-sorting in
ExtismPluginLoader::decrypt_container() if you prefer true load-order
semantics)—make the doc and code behavior consistent and mention the exact
precedence rule used.

In `@src/views/LinkGrabberView/LinkGrabberView.tsx`:
- Around line 205-228: The current container import flow calls
invoke("link_import_container") and throws away result.packageId, only
aggregating result.urls into aggregatedUrls and calling resolveLinks({ urls:
aggregatedUrls }); Update this flow to carry the originating packageId through
the resolution/download pipeline: when handling the
invoke("link_import_container") response, push an object containing
result.packageId and result.urls (or map urls -> { url, packageId }) into the
aggregate instead of plain strings, then change resolveLinks (and downstream
download_start calls) to accept and propagate an optional packageId parameter so
that the server-side download_start can attach created download IDs to the
provided packageId; alternatively, defer creating the package on import and
create it when download IDs exist, but ensure either resolveLinks or
download_start accepts and forwards result.packageId (from
invoke("link_import_container")) so imported links are associated with the
package rather than creating empty packages.
- Around line 203-208: The code in LinkGrabberView.tsx currently calls
file.arrayBuffer() and builds bytes before sending to
invoke("link_import_container"), which allocates the entire file in memory; add
a pre-check on file.size against the backend container size limit (define a
constant e.g. CONTAINER_MAX_BYTES matching the backend cap) and reject oversized
files immediately (show an error/notification and return) before calling
file.arrayBuffer(); keep the rest of the flow (Uint8Array -> Array.from ->
invoke<ImportContainerResult>) unchanged so only files within the limit are read
into memory.

---

Nitpick comments:
In `@src/types/container.ts`:
- Around line 1-5: ImportContainerResult currently types the field format as
string which prevents exhaustiveness checks; change format in the
ImportContainerResult interface to a closed string union matching the backend’s
allowed container formats (replace format: string with format: 'X' | 'Y' | ...),
update any code that constructs or switches on ImportContainerResult.format to
use the new union values, and adjust tests/typeguards to handle the explicit
variants so typos become compile-time errors; reference the
ImportContainerResult interface and any switch/if branches that inspect .format
when making the change.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 39b5d7b4-f59d-45aa-b741-2535b5d7e918

📥 Commits

Reviewing files that changed from the base of the PR and between c00d74b and 6be272d.

📒 Files selected for processing (16)
  • CHANGELOG.md
  • src-tauri/src/adapters/driven/plugin/extism_loader.rs
  • src-tauri/src/adapters/driven/plugin/registry.rs
  • src-tauri/src/adapters/driving/tauri_ipc.rs
  • src-tauri/src/application/commands/import_container.rs
  • src-tauri/src/application/commands/mod.rs
  • src-tauri/src/application/commands/tests_support.rs
  • src-tauri/src/domain/ports/driven/plugin_loader.rs
  • src-tauri/src/lib.rs
  • src/i18n/locales/en.json
  • src/i18n/locales/fr.json
  • src/types/container.ts
  • src/views/LinkGrabberView/LinkGrabberView.tsx
  • src/views/LinkGrabberView/PasteZone.tsx
  • src/views/LinkGrabberView/__tests__/LinkGrabberView.test.tsx
  • src/views/LinkGrabberView/__tests__/PasteZone.test.tsx

Comment thread CHANGELOG.md Outdated
Comment thread src-tauri/src/adapters/driven/plugin/extism_loader.rs
Comment thread src-tauri/src/adapters/driving/tauri_ipc.rs
Comment thread src-tauri/src/application/commands/import_container.rs
Comment thread src-tauri/src/domain/ports/driven/plugin_loader.rs
Comment thread src/views/LinkGrabberView/LinkGrabberView.tsx Outdated
Comment on lines +205 to +228
```tsx
        const result = await invoke<ImportContainerResult>("link_import_container", {
          fileName: file.name,
          fileBytes: bytes,
        });
        toast.success(
          t("linkGrabber.toast.containerImported", {
            count: result.urls.length,
            fileName: result.packageName,
            defaultValue: `Imported ${result.urls.length} links from ${result.packageName}`,
          }),
        );
        aggregatedUrls.push(...result.urls);
      } catch (err) {
        toast.error(
          t("linkGrabber.toast.containerImportFailed", {
            fileName: file.name,
            defaultValue: `Could not import ${file.name}: ${String(err)}`,
          }),
        );
      }
    }
    if (aggregatedUrls.length > 0) {
      resolveLinks({ urls: aggregatedUrls });
    }
```

@coderabbitai coderabbitai Bot May 6, 2026

⚠️ Potential issue | 🟠 Major | 🏗️ Heavy lift

Imported container packages are never populated.

link_import_container returns a packageId, but this flow discards it and only forwards raw URLs into the existing resolve/start path. In this file, downloads are still started later with plain download_start, so every successful container import creates an empty package record with no way for the imported downloads to land in it.

Please carry the originating packageId through resolve/start and attach created download IDs to that package, or defer package creation until the point where download IDs exist.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@src/views/LinkGrabberView/LinkGrabberView.tsx` around lines 205 - 228: the
current container import flow calls invoke("link_import_container") and throws
away result.packageId, only aggregating result.urls into aggregatedUrls before
calling resolveLinks({ urls: aggregatedUrls }). Update the flow to carry the
originating packageId through the resolution/download pipeline: when handling
the invoke("link_import_container") response, push an object containing
result.packageId and result.urls (or map urls -> { url, packageId }) into the
aggregate instead of plain strings, then change resolveLinks (and the
downstream download_start calls) to accept and propagate an optional packageId
parameter, so the server-side download_start can attach created download IDs to
the provided packageId. Alternatively, defer creating the package on import and
create it once download IDs exist. Either way, ensure resolveLinks or
download_start forwards result.packageId so imported links are associated with
their package rather than leaving empty package records.
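A minimal sketch of the suggested fix, assuming a hypothetical `AggregatedImport` shape; the helper names and the exact `resolveLinks` signature are assumptions for illustration, not the actual Vortex code:

```typescript
// Carry packageId alongside each URL instead of aggregating plain strings.
// AggregatedImport and both helper names are illustrative assumptions.
interface AggregatedImport {
  url: string;
  packageId: string | null;
}

// Container imports: every URL keeps the package it was decrypted into.
function aggregateImport(
  result: { packageId: string; urls: string[] },
  acc: AggregatedImport[],
): AggregatedImport[] {
  for (const url of result.urls) {
    acc.push({ url, packageId: result.packageId });
  }
  return acc;
}

// Pasted links have no originating package.
function aggregatePasted(urls: string[], acc: AggregatedImport[]): AggregatedImport[] {
  for (const url of urls) {
    acc.push({ url, packageId: null });
  }
  return acc;
}
```

With this shape, a downstream `download_start` call can read `packageId` per URL and attach the created download IDs to the right package instead of leaving an empty record.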

Owner Author

@coderabbitai

link_import_container returns a packageId, but this flow discards it and only forwards raw URLs into the existing resolve/start path.

Same root cause as the codex P1 finding above. Acknowledged but deferred — propagating packageId through link_resolve + download_start so created downloads land in the import package is a multi-handler pipeline change, tracked as a follow-up. The placeholder package row stays for now; this PR scope is the import + decrypt path only.

@mpiton, understood — the placeholder package being created ahead of its downloads is a known trade-off for this PR's scope. Deferring the packageId propagation through link_resolve + download_start makes sense given the pipeline scope of the change.

Would you like me to open a GitHub follow-up issue to track the work of propagating packageId so that imported container downloads land in the correct package rather than leaving an empty package record?


🧠 Learnings used
Learnt from: mpiton
Repo: mpiton/vortex PR: 127
File: src/views/AccountsView/ImportExportDialog.tsx:38-49
Timestamp: 2026-04-29T07:21:55.709Z
Learning: In mpiton/vortex, do not flag usages like `React.FormEvent` (and other `React.*` type references) for missing `import React` or `import type { ... }` as long as this codebase’s TypeScript setup provides the `React` namespace globally (i.e., `types/react` global namespace is relied upon) and `tsc --noEmit` is clean with `

Learnt from: mpiton
Repo: mpiton/vortex PR: 127
File: src/views/AccountsView/AddAccountDialog.tsx:65-65
Timestamp: 2026-04-29T07:25:31.488Z
Learning: In mpiton/vortex, do not report a missing import for type references written as `React.*` (e.g., `React.FormEvent`, `React.ChangeEvent`, `React.MouseEvent`) in TSX files. The project relies on `types/react` to provide the `React` namespace globally (with `tsconfig` using `"jsx": "react-jsx"` and `"strict": true`, and `tsc --noEmit` passing cleanly). Only raise an issue if `types/react` is removed from the project (i.e., the global `React` namespace is no longer available).

@codspeed-hq
Contributor

codspeed-hq Bot commented May 6, 2026

Merging this PR will degrade performance by 18.88%

⚡ 2 improved benchmarks
❌ 6 (👁 6) regressed benchmarks
✅ 18 untouched benchmarks

Performance Changes

| Benchmark | BASE | HEAD | Efficiency |
| --- | --- | --- | --- |
| 👁 ftp_scheme | 2.6 µs | 2.8 µs | -10.09% |
| 👁 simple_https | 2.4 µs | 2.7 µs | -11.51% |
| normalize_link_check_parallelism | 150 ns | 120.8 ns | +24.14% |
| 👁 with_userinfo | 2.6 µs | 3.1 µs | -16.5% |
| normalize_max_concurrent | 150 ns | 120.8 ns | +24.14% |
| 👁 complex_with_port_and_path | 2.6 µs | 3.2 µs | -16.04% |
| 👁 from_status_code_404 | 154.7 ns | 183.9 ns | -15.86% |
| 👁 from_status_code_500 | 125.3 ns | 154.4 ns | -18.88% |

Comparing feat/task-42-import-containers-ui (ae1f591) with main (c00d74b)

Open in CodSpeed

- Pre-check container file size before reading into memory (avoid wasting RAM on oversized drops)
- Batch aggregated container URLs into 500-URL chunks so large imports don't trip the resolve cap
- Surface container plugin probe failures as PluginError instead of NotFound (preserves load-failure context)
- Trim whitespace from decrypted URLs before collecting
- Append `.meta4` to install-hint message
- Document alphabetical precedence rule on `decrypt_container` port (matches adapter behavior)
- Fix CHANGELOG test count: 11 -> 16
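The 500-URL batching mentioned in these commits can be sketched with a generic chunk helper. The 500 cap comes from the resolve limit discussed in this thread; the helper name and its placement are illustrative, not the actual repo code:

```typescript
// Generic chunking helper: splits an array into consecutive slices of at
// most `size` elements, so each link_resolve call stays under the backend's
// 500-URL batch cap. The name `chunk` is an illustrative choice.
function chunk<T>(items: T[], size: number): T[][] {
  if (size <= 0) throw new Error("chunk size must be positive");
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

const RESOLVE_BATCH_CAP = 500; // backend rejects larger batches
```

For example, 1,200 aggregated container URLs would become three calls of 500, 500, and 200 URLs instead of one oversized call that the backend rejects.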

@chatgpt-codex-connector chatgpt-codex-connector Bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 77c6ff4314

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

Comment thread src/views/LinkGrabberView/LinkGrabberView.tsx Outdated
Each `link_resolve` mutation onSuccess called `setResolvedLinks(resolved)` and
overwrote the previous chunk's response. With >500 URLs the user only saw the
last chunk. Extract the post-resolve state mutation as `applyResolvedBatch` and,
on the chunked path, dispatch resolves directly via `invoke`, accumulate the
results, then apply the merged batch once.
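The accumulate-then-apply-once pattern described in that thread could look like the sketch below; `ResolvedLink`, `resolveChunk`, and `applyResolvedBatch` are illustrative stand-ins for the component's real types and helpers:

```typescript
// Accumulate per-chunk resolve results and apply the merged batch once,
// instead of letting each chunk's onSuccess overwrite the previous state.
interface ResolvedLink {
  url: string;
  host: string;
}

async function resolveInChunks(
  urls: string[],
  resolveChunk: (chunk: string[]) => Promise<ResolvedLink[]>,
  applyResolvedBatch: (merged: ResolvedLink[]) => void,
  cap = 500,
): Promise<void> {
  const merged: ResolvedLink[] = [];
  for (let i = 0; i < urls.length; i += cap) {
    const resolved = await resolveChunk(urls.slice(i, i + cap));
    merged.push(...resolved); // accumulate, do not touch state yet
  }
  applyResolvedBatch(merged); // single state mutation at the end
}
```

With >500 URLs the user then sees every resolved row, not just the last chunk's.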
@mpiton
Owner Author

mpiton commented May 7, 2026

On the CodSpeed regression report: this PR doesn't touch url_parsing, link_status, or config_operations code paths, and the report itself flags "Different runtime environments detected." All six regressed benchmarks (simple_https, with_userinfo, complex_with_port_and_path, ftp_scheme, from_status_code_404, from_status_code_500) live in unrelated modules — the only files this branch touches under src-tauri/ are the container import handler, the plugin loader port + Extism adapter, and the IPC bridge. Treating this as runner variance and acknowledging on CodSpeed.


@chatgpt-codex-connector chatgpt-codex-connector Bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 458ef4d4be


Comment thread src/views/LinkGrabberView/LinkGrabberView.tsx Outdated
Comment thread src/views/LinkGrabberView/LinkGrabberView.tsx
`link_check_online` and `link_detect_duplicates` both reject batches
above 500 URLs. Container imports above that threshold previously sent
a single oversized call, which the backend rejected and tripped the
`onError` fallback for every row — leaving large imports unable to
bulk-start (status stuck on `unknown`) and bypassing the duplicate gate.
Both helpers now slice into 500-URL chunks; each chunk's onError path
operates on its own `inFlightChunk` set so a failed chunk does not
overwrite a sibling chunk's resolved duplicate state.
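The per-chunk error isolation described in that commit can be sketched as follows; `LinkStatus`, the `check` callback, and `setStatuses` are illustrative stand-ins for the actual helpers:

```typescript
// Each chunk's onError fallback marks only its own in-flight URLs, so a
// failed chunk cannot clobber a sibling chunk's resolved state.
type LinkStatus = "checking" | "online" | "unknown";

async function startChunkedCheck(
  urls: string[],
  check: (chunk: string[]) => Promise<Map<string, LinkStatus>>,
  setStatuses: (updates: Map<string, LinkStatus>) => void,
  cap = 500,
): Promise<void> {
  const tasks: Promise<void>[] = [];
  for (let i = 0; i < urls.length; i += cap) {
    // Captured per iteration: this chunk's own in-flight set.
    const inFlightChunk = new Set(urls.slice(i, i + cap));
    tasks.push(
      check([...inFlightChunk])
        .then((result) => setStatuses(result))
        .catch(() => {
          // Fallback touches only this chunk's rows, never siblings'.
          const fallback = new Map<string, LinkStatus>();
          for (const url of inFlightChunk) fallback.set(url, "unknown");
          setStatuses(fallback);
        }),
    );
  }
  await Promise.all(tasks);
}
```

A chunk that rejects degrades only its own rows to `unknown`, while chunks that resolved keep their statuses.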

@chatgpt-codex-connector chatgpt-codex-connector Bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 0e6c089dda


Comment thread src/views/LinkGrabberView/LinkGrabberView.tsx Outdated

@coderabbitai coderabbitai Bot left a comment

Actionable comments posted: 2

🤖 Prompt for all review comments with AI agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

Inline comments:
In `@src/views/LinkGrabberView/LinkGrabberView.tsx`:
- Around lines 216-276: handleContainerFiles performs the container
import/resolve outside the resolveLinks mutation state, so overlapping drops
can race. Before starting, guard against concurrent runs: check the existing
isResolving flag used by resolveLinks and bail if it is true. While processing,
set a shared "resolving" state (reuse the resolveLinks mutation state if
possible, or a component-level isResolvingContainerImport boolean) and reset it
in a finally block. Ensure both the short-path resolveLinks({ urls }) call and
the chunked invoke("link_resolve", ...) flow respect this flag, and that
applyResolvedBatch(merged) only runs while the flag is active, so a second drop
cannot overwrite resolvedLinks.
- Around lines 141-173: the onError fallback in applyResolvedBatch can clobber
newer batches because it unconditionally calls setManyLinkStatuses for its
chunk. Introduce a simple batch guard: assign a batchId (e.g. an incrementing
ref or state such as currentResolveBatchId) each time applyResolvedBatch runs,
capture that id in the closure passed to checkLinksOnline, and in the onError
callback only call setManyLinkStatuses(chunk.map(...)) if the captured batchId
still equals currentResolveBatchId. Update applyResolvedBatch (and related
helpers) to use currentResolveBatchId when starting a batch so stale onError
handlers cannot overwrite statuses from newer batches.
ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: bf068a00-e6f0-443d-af30-3c9180acb9da

📥 Commits

Reviewing files that changed from the base of the PR and between 458ef4d and 0e6c089.

📒 Files selected for processing (1)
  • src/views/LinkGrabberView/LinkGrabberView.tsx

Comment thread src/views/LinkGrabberView/LinkGrabberView.tsx
Comment thread src/views/LinkGrabberView/LinkGrabberView.tsx
Three race conditions surfaced in the chunked container import path:

1. The chunked `applyResolvedBatch(merged)` call ran unconditionally
   after a long async loop, so a fresh paste-resolve started while
   chunks were in flight could be overwritten when the loop completed.
   Add `resolveBatchRef`: `applyResolvedBatch` increments it on every
   call, and the chunked container handler reserves an id before the
   loop and bails if a newer batch landed in the meantime.

2. The per-chunk `link_check_online` `onError` fallback marked rows
   `unknown` with no batch guard; an older batch erroring after a
   newer batch had pre-seeded `checking` could clobber the live
   statuses. Gate the fallback on the captured `batchId`.

3. Overlapping container drops both ran to completion since the
   handler lived outside the resolve mutation. Add an
   `isImportingContainers` state, bail immediately on a re-entrant
   call, and feed the flag into `PasteZone`'s `isLoading` so the
   Analyze button reflects that an import is still resolving.
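The re-entrancy guard in point 3 could be sketched as below; the flag is shown as a module-level boolean plus a setter standing in for the React state that feeds `PasteZone`'s `isLoading` — the names are illustrative:

```typescript
// Re-entrancy guard for overlapping container drops: a second drop bails
// immediately while the first is still importing/resolving.
// setImportingContainers stands in for the React state setter.
let isImportingContainers = false;

async function handleContainerFilesGuarded(
  files: string[],
  importAll: (files: string[]) => Promise<void>,
  setImportingContainers: (v: boolean) => void,
): Promise<boolean> {
  if (isImportingContainers) return false; // bail on re-entrant call
  isImportingContainers = true;
  setImportingContainers(true); // drives PasteZone's isLoading
  try {
    await importAll(files);
    return true; // this drop ran to completion
  } finally {
    isImportingContainers = false;
    setImportingContainers(false);
  }
}
```

The `finally` block guarantees the flag clears even when a container fails to decrypt, so the Analyze button never sticks in the loading state.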

@chatgpt-codex-connector chatgpt-codex-connector Bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 8a8c7d12ea


Comment thread src/views/LinkGrabberView/LinkGrabberView.tsx
`containerImportFailed` translation only carried `{{fileName}}`, so the
`defaultValue` carrying `String(err)` was discarded whenever the key
resolved (always, in practice). Users lost actionable diagnostics —
the IPC layer's "Install vortex-mod-containers..." hint never reached
the surface. Add a `{{reason}}` interpolation to en + fr, pass the
backend message (or a translated `containerTooLarge` for the local
size cap) through the toast call.
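The `{{reason}}` fix described above might look like the sketch below. The key paths follow the snippet quoted earlier in this thread, but the locale fragments are shown as TS objects and the interpolation is a minimal stand-in for i18next, purely for illustration:

```typescript
// en/fr locale fragments gaining a {{reason}} slot (real files are JSON).
const en = {
  linkGrabber: {
    toast: {
      containerImportFailed: "Could not import {{fileName}}: {{reason}}",
    },
  },
};
const fr = {
  linkGrabber: {
    toast: {
      containerImportFailed: "Impossible d'importer {{fileName}} : {{reason}}",
    },
  },
};

// Minimal interpolation showing the backend message now reaches the surface.
function interpolate(template: string, vars: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (_m, k: string) => vars[k] ?? "");
}
```

With the `reason` slot wired through, a decrypt failure surfaces the backend's install hint instead of a bare file name.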
@mpiton mpiton merged commit 131f203 into main May 7, 2026
15 checks passed
Labels

documentation, frontend, rust
